409 research outputs found

    A Proposal for Deploying Hybrid Knowledge Bases: the ADOxx-to-GraphDB Interoperability Case

    Get PDF
    Graph Database Management Systems brought data model abstractions closer to how humans naturally handle knowledge - i.e., driven by inferences across complex relationship networks rather than by encapsulating tuples under rigid schemata. Another discipline that commonly employs graph-like structures is diagrammatic Conceptual Modeling, where intuitive, graphical means of explicating knowledge are systematically studied and formalized. Building on this common ground of graph-based representation, the paper proposes an integration of OWL ontologies with diagrammatic representations as enabled by the ADOxx metamodeling platform. The proposal is based on the RDF-semantics variant of OWL and leads to a particular type of hybrid knowledge base hosted, for proof-of-concept purposes, by the GraphDB system due to its inferencing capabilities. The approach aims for complementarity and integration, providing agile diagrammatic means of creating semantic networks that are amenable to ontology-based reasoning.
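
    A minimal sketch of the kind of hybrid knowledge base the paper targets, assuming a local GraphDB instance at localhost:7200 with a repository named "hybridkb" and an inference-enabled ruleset; the vocabulary and diagram content are illustrative, not the paper's actual ADOxx export.

```python
# Sketch: load diagram-derived triples into GraphDB, then query with inference.
# Assumptions: GraphDB at localhost:7200, repository "hybridkb", inference on.
import requests

GRAPHDB = "http://localhost:7200/repositories/hybridkb"

# A few triples lifted from a diagrammatic model, typed against an RDFS/OWL vocabulary.
diagram_triples = """
@prefix ex:   <http://example.org/model#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
ex:OrderProcess a ex:BusinessProcess ;
    ex:involves ex:Customer .
ex:BusinessProcess rdfs:subClassOf ex:Process .
"""

# Load the lifted diagram content into the repository (RDF4J-style statements endpoint).
requests.post(f"{GRAPHDB}/statements",
              data=diagram_triples,
              headers={"Content-Type": "text/turtle"}).raise_for_status()

# With inference enabled, ex:OrderProcess is also retrievable as an ex:Process.
query = """
PREFIX ex: <http://example.org/model#>
SELECT ?p WHERE { ?p a ex:Process }
"""
resp = requests.get(GRAPHDB,
                    params={"query": query},
                    headers={"Accept": "application/sparql-results+json"})
print(resp.json()["results"]["bindings"])
```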

    Enriching Linked Data with Semantics from Domain-Specific Diagrammatic Models

    Get PDF
    One key driver of the Linked Data paradigm is the ability to lift data graphs from legacy systems by employing various adapters and RDFizers (e.g., D2RQ for relational databases, XLWrap for spreadsheets). Such approaches aim to remove the boundaries of enterprise data silos by opening them to cross-organizational linking within a “Web of Data”. An insufficiently tapped source of machine-readable semantics is the underlying graph nature of diagrammatic conceptual models – a kind of information that is richer than what is typically lifted from table schemata, especially when a domain-specific modeling language is employed. The paper advocates an approach to Linked Data enrichment based on a diagrammatic model RDFizer originally developed in the context of the ComVantage FP7 research project. A minimal but illustrative example is provided, from which arguments are generalized, leading to a proposed vision of “conceptual model”-aware information systems.
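
    A minimal sketch of the RDFizer idea under illustrative assumptions (the node/edge structure and the ex: vocabulary are invented for this example, not the ComVantage implementation): a small domain-specific diagram, given as plain Python dictionaries, is lifted into an rdflib graph ready for Linked Data publishing.

```python
# Sketch: lift a tiny diagrammatic model (nodes + edges) into RDF triples.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/dsml#")   # illustrative vocabulary

diagram = {
    "nodes": [
        {"id": "machine1", "type": "ProductionMachine", "label": "Lathe A"},
        {"id": "order42",  "type": "MaintenanceOrder",  "label": "Order 42"},
    ],
    "edges": [
        {"from": "order42", "relation": "targets", "to": "machine1"},
    ],
}

g = Graph()
g.bind("ex", EX)
for node in diagram["nodes"]:
    subject = EX[node["id"]]
    g.add((subject, RDF.type, EX[node["type"]]))          # node type -> RDF class
    g.add((subject, RDFS.label, Literal(node["label"])))  # node label -> rdfs:label
for edge in diagram["edges"]:
    g.add((EX[edge["from"]], EX[edge["relation"]], EX[edge["to"]]))  # edge -> property

print(g.serialize(format="turtle"))
```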

    Semantic Bridging between Conceptual Modeling Standards and Agile Software Projects Conceptualizations

    Get PDF
    Software engineering benefitted from modeling standards (e.g. UML, BPMN), but Agile Software Project Management tends to marginalize most forms of documentation, including diagrammatic modeling, focusing instead on the tracking of a project's backlog and related issues. Limited means are available for annotating Jira items with diagrams, but not at a granular and semantically traceable level. Business processes tend to get lost on the way between process analysis (if any) and backlog items; UML design decisions are often disconnected from the issue tracking environment. This paper proposes domain-specific conceptual modeling to obtain a diagrammatic view on a Jira project, motivated by past conceptualizations of the agile paradigm, while also offering basic interoperability with Jira to switch between environments and views. The underlying conceptualization extends conceptual modeling languages (BPMN, UML) with an agile project management perspective to enrich contextual traceability of a project's elements while ensuring that data structures handled by Jira can be captured and exposed to Jira if needed. Therefore, concepts underlying typical software development project management are integrated with established modeling concepts and tailored (with metamodeling means) to the domain-specificity of agile project management. A Design Science approach was pursued to develop a modeling method artifact, resulting in a domain-specific modeling tool for software project managers who want to augment agile practices and enrich issue annotation.
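
    A minimal sketch of the interoperability direction described above, assuming a Jira Cloud instance, a placeholder project key DEMO, and placeholder credentials; the mapping to model elements is illustrative, not the tool's actual metamodel.

```python
# Sketch: pull backlog items from Jira's REST search endpoint and map them
# to simple model elements that a diagram could reference.
import requests

JIRA = "https://your-domain.atlassian.net"          # placeholder instance
AUTH = ("user@example.org", "api-token")            # placeholder credentials

resp = requests.get(f"{JIRA}/rest/api/2/search",
                    params={"jql": "project = DEMO AND type = Story",
                            "fields": "summary,status"},
                    auth=AUTH)
resp.raise_for_status()

# Each backlog item becomes a candidate model element for diagram annotation.
model_elements = [
    {"modelId": issue["key"],
     "name": issue["fields"]["summary"],
     "state": issue["fields"]["status"]["name"]}
    for issue in resp.json()["issues"]
]
print(model_elements)
```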

    Towards a Modeling Method for Managing Node.js Projects and Dependencies

    Get PDF
    This paper proposes a domain-specific and technology-specific modeling method for managing Node.js projects. It addresses the challenge of managing dependencies in the NPM and REST ecosystems, while also providing a specialized workflow model type as a process-centric view on a software project. With the continuous growth of the Node.js environment, managing complex projects that use this technology can be chaotic, especially when it comes to planning dependencies and module integration. The deprecation of a module can lead to serious crises in the projects where that module was used; consequently, traceability of deprecation propagation becomes a key requirement in Node.js project management. The modeling method introduced in this paper provides a diagrammatic solution to managing module and API dependencies in a Node.js project. It is deployed as a modeling tool that can also generate REST API documentation and Node.js project configuration files that can be executed to install the graphically designed dependencies.
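
    A minimal sketch of the generation step under illustrative assumptions (the dependency model below is a hand-written list standing in for the diagram content): a graphically designed dependency model is turned into an installable package.json, and deprecated modules are flagged so their propagation can be reviewed.

```python
# Sketch: generate a Node.js project configuration file from a dependency model
# and warn about deprecated modules whose deprecation would propagate.
import json

dependency_model = [                                   # stands in for the diagram
    {"module": "express", "version": "^4.18.0", "deprecated": False},
    {"module": "request", "version": "^2.88.0", "deprecated": True},
]

package_json = {
    "name": "modeled-node-project",
    "version": "0.1.0",
    "dependencies": {d["module"]: d["version"] for d in dependency_model},
}

with open("package.json", "w") as f:
    json.dump(package_json, f, indent=2)               # ready for `npm install`

for d in dependency_model:
    if d["deprecated"]:
        print(f"warning: {d['module']} is deprecated; review its dependents")
```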

    User Experience Modeling Method for a Vision of Knowledge Graph-based Process Automation

    Get PDF
    This research proposes a User Experience modeling method which is an early-stage component of a research project's vision of innovating Robotic Process Automation with the help of Knowledge Graphs and Natural Language Processing. The core idea is to integrate, in RDF graphs, a representation of the user experience and contextual information about the organization and relevant data sources for that experience. Existing RPA tools use workflow repositories that employ XML-based descriptions for both processes and UI elements. They also provide built-in workflow designers that are not tailored for design-time analysis (e.g., model queries, reporting, reasoning) but instead merely raise the abstraction level of traditional scripting – from writing code to visually connecting pre-programmed pieces of functionality. The proposal detailed in this paper makes use of Domain-Specific Modeling Language engineering to repurpose an open source BPMN implementation for describing User Experience. We rely on a metamodeling platform to ensure that the resulting diagrams are also machine-readable, and we take advantage of an existing plug-in to make them available as RDF graphs that can be used by other components of an automation architecture. The paper focuses on the modeling method and tool as one of the early steps of the project's vision.
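
    A minimal sketch of what a machine-readable User Experience model enables once exported as RDF, using an invented ux: vocabulary rather than the project's actual schema: design-time analysis such as asking which user-facing steps depend on a given organizational data source.

```python
# Sketch: a tiny UX model as an RDF graph, plus a design-time SPARQL query.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

UX = Namespace("http://example.org/ux#")   # illustrative vocabulary
g = Graph()
g.bind("ux", UX)

# Two user-experience steps and the data source one of them reads from.
g.add((UX.FillInvoiceForm, RDF.type, UX.UserTask))
g.add((UX.FillInvoiceForm, UX.readsFrom, UX.ERPSystem))
g.add((UX.ApproveInvoice, RDF.type, UX.UserTask))
g.add((UX.FillInvoiceForm, UX.precedes, UX.ApproveInvoice))

# Design-time analysis: which steps depend on the ERP data source?
results = g.query("""
    PREFIX ux: <http://example.org/ux#>
    SELECT ?step WHERE { ?step ux:readsFrom ux:ERPSystem }
""")
for row in results:
    print(row.step)
```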

    From BPMN Models to Labelled Property Graphs

    Get PDF
    There is growing interest in leveraging the structured and formal nature of business process modeling languages in order to make them available not only for human analysis but also as machine-readable knowledge representations. Standard serializations of the past were predominantly XML-based, with some of them seemingly discontinued, e.g., XPDL after the dissolution of the Workflow Management Coalition. Recent research has been investigating the interplay between knowledge representation and business process modeling, with the focus typically placed on standards such as RDF and OWL. In this paper we introduce a converter that translates the standards-compliant BPMN XML format to Neo4j labelled property graphs (LPG), thus providing an alternative to both traditional XML-based serialization and to more recent experimental RDF solutions, while ensuring conceptual alignment with the standard serialization of BPMN 2.0. A demonstrator was built to highlight the benefits of having such a converter and the completeness of coverage for BPMN models. The proposal facilitates graph-based processing of business process models in a knowledge-intensive context, where procedural knowledge available as BPMN diagrams must be exposed to machines and LPG-driven applications.
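
    A minimal sketch of the conversion direction, covering only tasks and sequence flows while the paper's converter covers the full standard serialization; the input file name and Neo4j connection details are placeholders.

```python
# Sketch: read a standards-compliant BPMN 2.0 XML file and write tasks and
# sequence flows into Neo4j as a labelled property graph.
import xml.etree.ElementTree as ET
from neo4j import GraphDatabase

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

root = ET.parse("process.bpmn").getroot()     # placeholder input file

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
with driver.session() as session:
    # One labelled node per BPMN task, keyed by its element id.
    for task in root.iter(f"{{{BPMN_NS}}}task"):
        session.run("MERGE (t:Task {id: $id}) SET t.name = $name",
                    id=task.get("id"), name=task.get("name"))
    # One relationship per sequence flow between the referenced elements.
    for flow in root.iter(f"{{{BPMN_NS}}}sequenceFlow"):
        session.run("""
            MATCH (a {id: $src}), (b {id: $tgt})
            MERGE (a)-[:SEQUENCE_FLOW {id: $id}]->(b)
        """, src=flow.get("sourceRef"), tgt=flow.get("targetRef"), id=flow.get("id"))
driver.close()
```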

    A Metamodeling Approach to Teaching Conceptual Modeling at Large

    Get PDF
    At the authors' university there is a challenge, with respect to Conceptual Modeling topics, of bridging the gap between bachelor-level studies and research work. At bachelor level, Conceptual Modeling is subordinated to Software Engineering topics, consequently making extensive use of software design standards. However, at doctoral level or in project-based work, modeling methods must be scientifically framed within wider-scoped paradigms – Design Science, Enterprise Modeling, etc. In order to bridge this gap, we developed a teaching artifact to present Conceptual Modeling as a standalone discipline that can produce its own artifacts, driven by requirements in a variety of domains. The teaching artifact is an agile modeling method that is iteratively implemented by students. The key takeaway for students is that a modeling language is a knowledge schema that can be tailored and migrated for specific purposes just like a database schema, to accommodate an application domain and its modeling requirements.
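
    A minimal sketch of the database-schema analogy, entirely illustrative: a generic modeling language treated as a knowledge schema that is tailored for a specific domain, much like a schema migration adds tables or columns.

```python
# Sketch: a generic language "schema" and a domain-specific tailoring of it.
base_language = {
    "Activity": {"attributes": ["name"]},
    "Flow":     {"source": "Activity", "target": "Activity"},
}

# Tailoring step (the "migration"): a hospital domain specializes Activity
# and adds a domain-specific attribute required by its modeling scenarios.
hospital_language = dict(base_language)
hospital_language["Treatment"] = {
    "specializes": "Activity",
    "attributes": ["name", "requiredEquipment"],
}

print(sorted(hospital_language))
```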

    Marrying Big Data with Smart Data in Sensor Stream Processing

    Get PDF
    Widespread deployments of spatially distributed sensors are continuously generating data that require advanced analytical processing and interpretation by machines. Devising machine-interpretable descriptions of sensor data is a key issue in building a semantic stream processing engine. This paper proposes a semantic sensor stream processing pipeline that uses Apache Kafka to publish and subscribe to semantic data streams in a scalable way. We use the Kafka Consumer API to annotate the sensor data with the Semantic Sensor Network ontology, then store the annotated output in an RDF triplestore for further reasoning or semantic integration with legacy information systems. We follow a Design Science approach addressing a Smart Airport scenario with geolocated audio sensors to evaluate the viability of the proposed pipeline under various Kafka-based configurations. Our experimental evaluations show that the multi-broker Kafka cluster setup supports read scalability, thus facilitating the parallelization of the semantic enrichment of the sensor data.
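
    A minimal sketch of the annotation step, assuming a Kafka topic named "audio-sensors", a local broker, and JSON-encoded readings; the airport vocabulary is illustrative and the SOSA core of the SSN ontology stands in for the full annotation scheme.

```python
# Sketch: consume raw sensor readings from Kafka and annotate them as
# SOSA/SSN observations before loading them into an RDF triplestore.
import json
from kafka import KafkaConsumer
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

SOSA = Namespace("http://www.w3.org/ns/sosa/")
EX = Namespace("http://example.org/airport#")       # illustrative vocabulary

consumer = KafkaConsumer("audio-sensors",
                         bootstrap_servers=["localhost:9092"],
                         value_deserializer=lambda v: json.loads(v.decode("utf-8")))

g = Graph()
g.bind("sosa", SOSA)
for message in consumer:
    reading = message.value                         # e.g. {"sensor": "mic-07", "db": 71.4}
    obs = EX[f"obs-{message.offset}"]
    g.add((obs, RDF.type, SOSA.Observation))
    g.add((obs, SOSA.madeBySensor, EX[reading["sensor"]]))
    g.add((obs, SOSA.hasSimpleResult, Literal(reading["db"], datatype=XSD.double)))
    # In the full pipeline the annotated graph is pushed to the triplestore here.
```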

    An Open Platform for Modeling Method Conceptualization: The OMiLAB Digital Ecosystem

    Get PDF
    This paper motivates, describes, demonstrates in use, and evaluates the Open Models Laboratory (OMiLAB) – an open digital ecosystem designed to help one conceptualize and operationalize conceptual modeling methods. The OMiLAB ecosystem, motivated by a generalized understanding of “model value”, targets research and education stakeholders who fulfill various roles in a modeling method's lifecycle. While we have many reports on novel modeling methods and tools for various domains, we lack knowledge on conceptualizing such methods via a full-fledged dedicated open ecosystem and a methodology that facilitates entry points for novices and an open innovation space for experienced stakeholders. This gap persists due to the lack of an open process and platform for 1) conducting research in the field of modeling method design, 2) developing agile modeling tools and model-driven digital products, and 3) experimenting with and disseminating such methods and related prototypes. OMiLAB incorporates the principles, practices, procedures, tools, and services required to address these issues, since it focuses on being the operational deployment of a conceptualization and operationalization process built on several pillars: 1) a granularly defined “modeling method” concept whose building blocks one can customize for the domain of choice, 2) an “agile modeling method engineering” framework that helps one quickly prototype modeling tools, 3) a model-aware “digital product design lab”, and 4) dissemination channels for reaching a global community. In this paper, we demonstrate and evaluate OMiLAB in research with two selected application cases for domain- and case-specific requirements. Beyond these exemplary cases, OMiLAB has proven to effectively satisfy the requirements raised by almost 50 modeling methods and, thus, to support researchers in designing novel modeling methods, developing tools, and disseminating outcomes. We also measured OMiLAB's educational impact.

    Financial Performance Assessment of Cooperatives in Pelalawan Regency (Penilaian Kinerja Keuangan Koperasi di Kabupaten Pelalawan)

    Full text link
    This paper describes the development and financial performance of cooperatives in Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts, measuring performance in terms of productivity, efficiency, growth, liquidity, and solvency. Productivity of the cooperatives in Pelalawan was high but efficiency remained low. Profit and income were high, liquidity was very high, and solvency was good.